Sparse Subspace Clustering via Two-Step Reweighted L1-Minimization: Algorithm and Provable Neighbor Recovery Rates
Authors
Abstract
Sparse subspace clustering (SSC) relies on sparse regression for accurate neighbor identification. Inspired by recent progress in compressive sensing, this paper proposes a new scheme for SSC via two-step reweighted $\ell_1$-minimization, which also generalizes the two-step $\ell_1$-minimization algorithm introduced by E. J. Candès et al. [The Annals of Statistics, vol. 42, no. 2, pp. 669–699, 2014] without incurring extra algorithmic complexity. To fully exploit the prior information offered by the representation vector computed in the first step, our approach places a weight on each component of the representation vector and solves a weighted LASSO in the second step. We propose a data weighting rule suitable for enhancing neighbor identification accuracy. Then, under the formulation of the dual problem of LASSO, we study in depth the theoretical neighbor recovery rates of the proposed scheme. Specifically, an interesting connection between the locations of the nonzeros of the optimal solution and the indexes of the active constraints is established. Afterwards, under a semi-random model, analytic lower/upper bounds on the probabilities of various recovery events are derived. Our results confirm that, with the aid of prior information that is accurate enough, the proposed weighting can produce many correct neighbors and few incorrect ones as compared to uniform weighting. Computer simulations are provided to validate our analytic study and evidence the effectiveness of the proposed approach.
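The two-step idea in the abstract can be sketched in code: solve a plain LASSO self-representation first, then re-solve a weighted LASSO whose per-component weights are derived from the step-1 solution. This is a minimal illustrative sketch, not the paper's exact algorithm; in particular, the weighting rule `w_i = 1/(|c_i| + eps)` and the ISTA solver are assumptions chosen for simplicity.

```python
import numpy as np

def ista_weighted_lasso(A, y, w, lam=0.05, n_iter=1000):
    """Solve min_c 0.5*||y - A c||_2^2 + lam * sum_i w_i |c_i|
    by proximal gradient descent (ISTA) with per-coordinate
    soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    c = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = c - A.T @ (A @ c - y) / L      # gradient step on the smooth term
        c = np.sign(z) * np.maximum(np.abs(z) - lam * w / L, 0.0)
    return c

def two_step_representation(X, j, lam=0.05, eps=1e-3):
    """Two-step sketch for the self-representation of column X[:, j]:
    step 1 solves a uniformly weighted LASSO; step 2 re-solves a weighted
    LASSO whose weights come from the step-1 solution.  The rule
    w_i = 1/(|c_i| + eps) is illustrative, not necessarily the paper's."""
    A = np.delete(X, j, axis=1)            # represent X[:, j] by the other points
    y = X[:, j]
    c1 = ista_weighted_lasso(A, y, np.ones(A.shape[1]), lam)   # step 1
    w = 1.0 / (np.abs(c1) + eps)           # large step-1 entries -> small penalty
    return ista_weighted_lasso(A, y, w, lam)                   # step 2
```

On synthetic data drawn from two low-dimensional subspaces, the step-2 coefficients should concentrate on same-subspace columns, which is exactly the "correct neighbor" property the paper analyzes.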
Similar resources
Improved sparse recovery thresholds with two-step reweighted ℓ1 minimization
It is well known that l1 minimization can be used to recover sufficiently sparse unknown signals from compressed linear measurements. In fact, exact thresholds on the sparsity, as a function of the ratio between the system dimensions, so that with high probability almost all sparse signals can be recovered from iid Gaussian measurements, have been computed and are referred to as “weak threshold...
On recovery of sparse signals via l1 minimization
This article considers constrained l1 minimization methods for the recovery of high dimensional sparse signals in three settings: noiseless, bounded error and Gaussian noise. A unified and elementary treatment is given in these noise settings for two l1 minimization methods: the Dantzig selector and l1 minimization with an l2 constraint. The results of this paper improve the existing results in...
Recovery of High-Dimensional Sparse Signals via l1-Minimization
We consider the recovery of high-dimensional sparse signals via l1-minimization under a mutual incoherence condition, which is shown to be sufficient for sparse signal recovery in both the noiseless and noisy cases. We study both l1-minimization under the l2 constraint and the Dantzig selector. Using the two l1-minimization methods and a technical inequality, some results are obtained. They im...
Jointly Sparse Vector Recovery via Reweighted ℓ1
An iterative reweighted algorithm is proposed for the recovery of jointly sparse vectors from multiple measurement vectors (MMV). The proposed MMV algorithm is an extension of the iterative reweighted ℓ1 algorithm for single-measurement problems. The proposed algorithm (M-IRL1) is demonstrated to outperform non-reweighted MMV algorithms under noiseless measurements. A regularization of the M-IRL...
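The single-measurement iterative reweighted ℓ1 scheme that M-IRL1 extends can be sketched as repeatedly solving a weighted basis-pursuit linear program and refreshing the weights from the previous solution. This is an illustrative sketch in the style of Candès, Wakin, and Boyd; the LP formulation via `scipy.optimize.linprog` and the update `w_i = 1/(|x_i| + eps)` are assumptions for the example, not code from any of the cited papers.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_min(A, y, w):
    """min_x sum_i w_i |x_i|  s.t.  A x = y, posed as a linear program
    over (x, t) with the standard epigraph trick |x_i| <= t_i."""
    m, n = A.shape
    cost = np.concatenate([np.zeros(n), w])          # minimize w^T t
    I = np.eye(n)
    A_ub = np.block([[I, -I], [-I, -I]])             # x - t <= 0, -x - t <= 0
    A_eq = np.hstack([A, np.zeros((m, n))])
    res = linprog(cost, A_ub=A_ub, b_ub=np.zeros(2 * n),
                  A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * n)
    return res.x[:n]

def iterative_reweighted_l1(A, y, n_rounds=4, eps=1e-2):
    """Repeatedly solve weighted basis pursuit, updating the weights from
    the previous solution: small entries get heavily penalized, which
    sharpens the support estimate round by round."""
    w = np.ones(A.shape[1])
    for _ in range(n_rounds):
        x = weighted_l1_min(A, y, w)
        w = 1.0 / (np.abs(x) + eps)
    return x
```

For a sufficiently sparse signal and enough Gaussian measurements, a handful of reweighting rounds typically recovers the signal exactly, which is the improvement over plain ℓ1 that the snippet above describes.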
Sparse Subspace Clustering via Group Sparse Coding
We propose in this paper a novel sparse subspace clustering method that regularizes sparse subspace representation by exploiting the structural sharing between tasks and data points via group sparse coding. We derive simple, provably convergent, and computationally efficient algorithms for solving the proposed group formulations. We demonstrate the advantage of the framework on three challengin...
Journal
Journal title: IEEE Transactions on Information Theory
Year: 2021
ISSN: 0018-9448, 1557-9654
DOI: https://doi.org/10.1109/tit.2020.3039114